Approximate invariant subspaces and quasi-Newton optimization methods

Authors

  • Serge Gratton
  • Philippe L. Toint
Abstract

New approximate secant equations are shown to result from the knowledge of (problem-dependent) invariant subspace information, which in turn suggests improvements in quasi-Newton methods for unconstrained minimization. A new limited-memory BFGS method using approximate secant equations is then derived and its encouraging behaviour illustrated on a small collection of multilevel optimization examples. The smoothing properties of this algorithm are considered next, and the automatic generation of approximate eigenvalue information is demonstrated. The use of this information for improving algorithmic performance is finally investigated on the same multilevel examples.
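The secant equations the abstract builds on are the defining constraint of quasi-Newton updates: the next Hessian approximation must map the last step onto the observed gradient change. A minimal numerical sketch of the standard BFGS update (not the paper's approximate variant; the quadratic test problem and all variable names are illustrative assumptions):

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change).
    By construction the result satisfies the secant equation B_new @ s = y.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Illustrative quadratic model f(x) = 0.5 x^T A x, for which y = A s exactly.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
s = np.array([1.0, -0.5])
y = A @ s

B1 = bfgs_update(np.eye(2), s, y)
print(np.allclose(B1 @ s, y))  # True: the secant equation holds
```

Because the curvature condition y @ s > 0 holds on this convex quadratic, the update also keeps B1 symmetric positive definite, which is what makes BFGS directions descent directions.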


Similar articles

Reduced-Hessian Quasi-Newton Methods for Unconstrained Optimization

Quasi-Newton methods are reliable and efficient on a wide range of problems, but they can require many iterations if the problem is ill-conditioned or if a poor initial estimate of the Hessian is used. In this paper, we discuss methods designed to be more efficient in these situations. All the methods to be considered exploit the fact that quasi-Newton methods accumulate approximate second-deri...


Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization

Limited-memory BFGS quasi-Newton methods approximate the Hessian matrix of second derivatives by the sum of a diagonal matrix and a fixed number of rank-one matrices. These methods are particularly effective for large problems in which the approximate Hessian cannot be stored explicitly. It can be shown that the conventional BFGS method accumulates approximate curvature in a sequence of expandi...


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve an unconstrained optimization problem using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...


Weak*-closed invariant subspaces and ideals of semigroup algebras on foundation semigroups

Let S be a locally compact foundation semigroup with identity and                          be its semigroup algebra. Let X be a weak*-closed left translation invariant subspace of    In this paper, we prove that  X  is invariantly  complemented in   if and  only if  the left ideal  of    has a bounded approximate identity. We also prove that a foundation semigroup with identity S is left amenab...


A Dynamic Parametrization Scheme for Shape Optimization Using Quasi-Newton Methods

A variable parametrization scheme is developed and demonstrated for shape optimization using quasi-Newton methods. The scheme performs adaptive parametrization refinement while preserving the approximate Hessian of the shape optimization problem and enables free-form shape design using quasi-Newton optimization methods. Using a B-spline parametrization, the scheme is validated using a 1-D shape ...



Journal:
  • Optimization Methods and Software

Volume 25, Issue 

Pages  -

Year of publication 2010